Representing inferential uncertainty in deep neural networks through sampling
Authors
Abstract
As deep neural networks (DNNs) are applied to increasingly challenging problems, they will need to be able to represent their own uncertainty. Modeling uncertainty is one of the key features of Bayesian methods. Scalable Bayesian DNNs that use dropout-based variational distributions have recently been proposed. Here we evaluate the ability of Bayesian DNNs trained with Bernoulli or Gaussian distributions over units (dropout) or weights (dropconnect) to represent their own uncertainty at the time of inference through sampling. We tested how well Bayesian fully connected and convolutional DNNs represented their own uncertainty in classifying the MNIST handwritten digits. By adding different levels of Gaussian noise to the test images, we assessed how DNNs represented their uncertainty about regions of input space not covered by the training set. Bayesian DNNs estimated their own uncertainty more accurately than traditional DNNs with a softmax output. These results are important for building better deep learning systems and for investigating the hypothesis that biological neural networks use sampling to represent uncertainty.
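The sketch below is not the authors' code; it is a minimal illustration, in PyTorch, of the evaluation the abstract describes: a fully connected classifier with Bernoulli dropout is kept in training mode at test time so that each forward pass samples a different dropout mask, and the entropy of the averaged class probabilities serves as the uncertainty estimate on noise-corrupted inputs. The layer sizes, noise levels, and number of samples are illustrative assumptions, and the model would of course be trained on MNIST before this probe.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DropoutMLP(nn.Module):
    def __init__(self, p_drop=0.5):
        super().__init__()
        self.fc1 = nn.Linear(28 * 28, 512)
        self.fc2 = nn.Linear(512, 512)
        self.out = nn.Linear(512, 10)
        self.drop = nn.Dropout(p_drop)  # Bernoulli dropout over units

    def forward(self, x):
        x = x.view(x.size(0), -1)
        x = self.drop(F.relu(self.fc1(x)))
        x = self.drop(F.relu(self.fc2(x)))
        return self.out(x)

@torch.no_grad()
def mc_dropout_predict(model, x, n_samples=50):
    """Average the softmax over stochastic forward passes (MC dropout)."""
    model.train()  # keep dropout active at inference time
    probs = torch.stack([F.softmax(model(x), dim=-1) for _ in range(n_samples)])
    mean_probs = probs.mean(dim=0)  # approximate predictive distribution
    entropy = -(mean_probs * mean_probs.clamp_min(1e-12).log()).sum(dim=-1)
    return mean_probs, entropy      # entropy used as the uncertainty measure

# Probe inputs far from the training distribution with additive Gaussian noise.
model = DropoutMLP()
images = torch.rand(8, 1, 28, 28)  # stand-in for MNIST test images
for sigma in (0.0, 0.5, 1.0):
    noisy = (images + sigma * torch.randn_like(images)).clamp(0.0, 1.0)
    _, entropy = mc_dropout_predict(model, noisy)
    print(f"sigma={sigma:.1f}  mean predictive entropy={entropy.mean():.3f}")
```

A well-behaved Bayesian DNN should report higher predictive entropy as the noise level grows, whereas a plain softmax network often stays overconfident on such inputs.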
Similar articles
Robustly representing uncertainty through sampling in deep neural networks
As deep neural networks (DNNs) are applied to increasingly challenging problems, they will need to be able to represent their own uncertainty. Modelling uncertainty is one of the key features of Bayesian methods. Using Bernoulli dropout with sampling at prediction time has recently been proposed as an efficient and well performing variational inference method for DNNs. However, sampling from ot...
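The main abstract also mentions sampling Bernoulli distributions over weights rather than units (dropconnect). The following sketch, not taken from either paper, shows that variant under illustrative layer sizes and keep probability: each forward pass draws a fresh Bernoulli mask over the weight matrix, so repeated passes at prediction time again yield samples from an approximate predictive distribution.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DropConnectLinear(nn.Module):
    def __init__(self, in_features, out_features, p_drop=0.5):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(out_features, in_features) * 0.01)
        self.bias = nn.Parameter(torch.zeros(out_features))
        self.p_keep = 1.0 - p_drop

    def forward(self, x):
        # Sample a Bernoulli mask over individual weights on every call.
        mask = torch.bernoulli(torch.full_like(self.weight, self.p_keep))
        return F.linear(x, self.weight * mask / self.p_keep, self.bias)

# Repeated stochastic passes give samples from the approximate predictive distribution.
layer = DropConnectLinear(28 * 28, 10)
x = torch.rand(4, 28 * 28)
samples = torch.stack([F.softmax(layer(x), dim=-1) for _ in range(20)])
print(samples.mean(dim=0))  # predictive mean over samples
print(samples.var(dim=0))   # per-class sampling variance
```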
Uncertainty Quality | Uncertainty in Deep Learning
In this chapter we assess the techniques developed in the previous chapters, concentrating on questions such as what our model uncertainty looks like. We experiment with different model architectures and approximating distributions, and use various regression and classification settings. Assessing the models’ confidence quantitatively we can see how much we sacrifice in our attempt at deriving ...
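One common way to assess uncertainty quality quantitatively, in the spirit of the excerpt above, is to check whether the model is more accurate on the predictions it is confident about. The sketch below uses placeholder arrays; in practice `probs` would be the averaged stochastic predictions and `labels` the test-set targets.

```python
import numpy as np

def confidence_accuracy_table(probs, labels, n_bins=5):
    """Bin test points by predictive confidence and report accuracy per bin."""
    confidence = probs.max(axis=1)
    predictions = probs.argmax(axis=1)
    correct = (predictions == labels).astype(float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    rows = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (confidence > lo) & (confidence <= hi)
        if in_bin.any():
            rows.append((lo, hi, int(in_bin.sum()), correct[in_bin].mean()))
    return rows  # a well-calibrated model has accuracy close to confidence in each bin

# Toy usage with random placeholder predictions:
rng = np.random.default_rng(0)
probs = rng.dirichlet(np.ones(10), size=1000)
labels = rng.integers(0, 10, size=1000)
for lo, hi, n, acc in confidence_accuracy_table(probs, labels):
    print(f"confidence ({lo:.1f}, {hi:.1f}]: n={n:4d}  accuracy={acc:.3f}")
```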
Efficient exploration with Double Uncertain Value Networks
This paper studies directed exploration for reinforcement learning agents by tracking uncertainty about the value of each available action. We identify two sources of uncertainty that are relevant for exploration. The first originates from limited data (parametric uncertainty), while the second originates from the distribution of the returns (return uncertainty). We identify methods to learn th...
Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning
Deep learning has gained tremendous attention in applied machine learning. However such tools for regression and classification do not capture model uncertainty. Bayesian models offer a mathematically grounded framework to reason about model uncertainty, but usually come with a prohibitive computational cost. We show that dropout in neural networks (NNs) can be cast as a Bayesian approximation....
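For reference, the approximation used in this line of work is usually written as an average over stochastic forward passes; the notation below (T samples, dropout variational distribution q(W)) follows the standard presentation rather than the paper's exact symbols.

```latex
% Monte Carlo dropout approximation of the predictive distribution for
% classification: average the softmax outputs of T stochastic forward
% passes, each with weights drawn from the dropout variational distribution.
p\bigl(y = c \mid x^{*}, \mathcal{D}\bigr)
  \;\approx\; \frac{1}{T} \sum_{t=1}^{T}
  \operatorname{softmax}\!\bigl(f(x^{*}; \widehat{W}_{t})\bigr)_{c},
  \qquad \widehat{W}_{t} \sim q(W).
```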
Cystoscopy Image Classification Using Deep Convolutional Neural Networks
In the past three decades, the use of smart methods in medical diagnostic systems has attracted the attention of many researchers. However, no such work has been done in the field of medical image processing for the diagnosis of bladder cancer through cystoscopy images, despite its high prevalence worldwide. In this paper, two well-known convolutional neural networks (CNNs) ...